Cross Entropy of Neural Language Models at Infinity—A New Bound of the Entropy Rate


Similar articles

The Rate of Entropy for Gaussian Processes

In this paper, we show that the Tsallis entropy rate of a stochastic process can be obtained as the limit of conditional entropies, as was done for the Shannon and Renyi entropy rates. Using this, we obtain the Tsallis entropy rate for stationary Gaussian processes. Finally, we derive the relation between the Renyi, Shannon and Tsallis entropy rates for stationary Gaussian proc...
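
For reference, a minimal sketch of the standard definitions this abstract relies on (textbook forms, not taken from the paper itself): the entropy rate of a stationary process is the limit of conditional entropies, and the Tsallis entropy of order q recovers Shannon entropy in the limit q -> 1.

    % Entropy rate as a limit of conditional entropies (Shannon case)
    h(\mathcal{X}) = \lim_{n \to \infty} H(X_n \mid X_{n-1}, \dots, X_1)

    % Tsallis entropy of order q (q > 0, q \ne 1); Shannon entropy is the q -> 1 limit
    S_q(X) = \frac{1}{q - 1} \Big( 1 - \sum_x p(x)^q \Big)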


A New Estimator of Entropy

In this paper we propose an estimator of the entropy of a continuous random variable. The estimator is obtained by modifying the estimator proposed by Vasicek (1976). Consistency of the estimator is proved, and comparisons are made with Vasicek’s estimator (1976), van Es’s estimator (1992), Ebrahimi et al.’s estimator (1994) and Correa’s estimator (1995). The results indicate that the proposed esti...
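
As a rough illustration of the spacing-based approach this line of work builds on, here is a minimal Python sketch of Vasicek's (1976) estimator; the m ~ sqrt(n) window heuristic and the function name are choices of this sketch, not from the paper.

    import numpy as np

    def vasicek_entropy(sample, m=None):
        # Vasicek (1976): H_mn = (1/n) * sum_i log( n/(2m) * (X_(i+m) - X_(i-m)) ),
        # with the convention X_(i) = X_(1) for i < 1 and X_(i) = X_(n) for i > n.
        x = np.sort(np.asarray(sample, dtype=float))
        n = x.size
        if m is None:
            m = max(1, int(round(np.sqrt(n))))  # assumed window heuristic
        lo = np.clip(np.arange(n) - m, 0, n - 1)  # index of X_(i-m), clamped at the edges
        hi = np.clip(np.arange(n) + m, 0, n - 1)  # index of X_(i+m), clamped at the edges
        return float(np.mean(np.log(n / (2.0 * m) * (x[hi] - x[lo]))))

    # Sanity check: N(0,1) has differential entropy 0.5*log(2*pi*e) ~ 1.419 nats
    print(vasicek_entropy(np.random.default_rng(0).normal(size=10_000)))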


Estimation of the Entropy Rate of Ergodic Markov Chains

In this paper, an approximation of the entropy rate of an ergodic Markov chain is calculated via sample-path simulation. Although an explicit form of the entropy rate exists, the exact computational method is laborious to apply. It is demonstrated that the entropy rate of a Markov chain estimated via a sample path not only converges to the correct entropy rate but also does so exponential...
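
For context, the explicit form mentioned above is H = -sum_i pi_i sum_j P_ij log P_ij, with pi the stationary distribution, while the ergodic theorem guarantees that -(1/n) sum_t log P[x_t, x_{t+1}] along a simulated path converges to the same value. A minimal sketch of both (the function names and the example matrix are illustrative, not from the paper):

    import numpy as np

    def exact_entropy_rate(P):
        # H = -sum_i pi_i * sum_j P_ij * log(P_ij), with 0 * log 0 taken as 0
        evals, evecs = np.linalg.eig(P.T)
        pi = np.real(evecs[:, np.argmin(np.abs(evals - 1))])  # eigenvector for eigenvalue 1
        pi /= pi.sum()
        logP = np.where(P > 0, np.log(np.where(P > 0, P, 1.0)), 0.0)
        return float(-np.sum(pi[:, None] * P * logP))

    def sample_path_entropy_rate(P, n=100_000, seed=0):
        # -(1/n) * sum_t log P[x_t, x_{t+1}] converges to H almost surely
        rng = np.random.default_rng(seed)
        k, x, total = P.shape[0], 0, 0.0
        for _ in range(n):
            nxt = rng.choice(k, p=P[x])
            total += np.log(P[x, nxt])
            x = nxt
        return -total / n

    P = np.array([[0.9, 0.1], [0.2, 0.8]])
    print(exact_entropy_rate(P), sample_path_entropy_rate(P))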


The Role of Application of the Dynamic Assessment Approach in the Improvement of Iranian EFL Writing Performance at Different Language Proficiency Levels

The present study sought to investigate the role of dynamic assessment (DA) in the improvement of Iranian EFL writing performance at different language proficiency levels. To this end, after conducting the Quick Placement Test, 60 Iranian EFL learners were assigned to two groups with different language proficiency levels. In both groups, each participant wrote two compositions, one before and one af...

Taylor Expansion for the Entropy Rate of Hidden Markov Chains

We study the entropy rate of a hidden Markov process, defined by observing the output of a symmetric channel whose input is a first-order Markov process. Although this definition is very simple, obtaining the exact value of the entropy rate is an open problem. We introduce some probability matrices based on the Markov chain's and the channel's parameters. Then, we try to obtain an estimate ...
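
Although no closed form is known, the entropy rate can be approximated numerically. One standard route (a sketch under assumed parameter names, not the matrix construction developed in this paper) simulates a binary Markov source through a binary symmetric channel and runs the forward recursion, so that -(1/n) log P(z_1, ..., z_n) converges to the entropy rate by the Shannon-McMillan-Breiman theorem.

    import numpy as np

    def hmm_entropy_rate_mc(p01, p10, eps, n=200_000, seed=1):
        # Binary Markov source with P(0->1) = p01, P(1->0) = p10, observed
        # through a binary symmetric channel with crossover probability eps.
        rng = np.random.default_rng(seed)
        A = np.array([[1 - p01, p01], [p10, 1 - p10]])   # source transitions
        B = np.array([[1 - eps, eps], [eps, 1 - eps]])   # P(z | x) for the channel
        pi = np.array([p10, p01]) / (p01 + p10)          # stationary distribution
        x = rng.choice(2, p=pi)                          # hidden source state
        alpha, log_prob = pi.copy(), 0.0
        for _ in range(n):
            x = rng.choice(2, p=A[x])                    # step the source
            z = x if rng.random() >= eps else 1 - x      # noisy channel output
            alpha = (alpha @ A) * B[:, z]                # forward recursion step
            c = alpha.sum()                              # P(z_t | z_1, ..., z_{t-1})
            log_prob += np.log(c)
            alpha /= c
        return -log_prob / n                             # -> H(Z) in nats, a.s.

    print(hmm_entropy_rate_mc(0.3, 0.3, 0.1))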



Journal

Journal title: Entropy

Year: 2018

ISSN: 1099-4300

DOI: 10.3390/e20110839